Distance-based and continuum Fano inequalities with applications to statistical estimation

Authors

  • John C. Duchi
  • Martin J. Wainwright
Abstract

In this technical note, we give two extensions of the classical Fano inequality in information theory. The first extends Fano’s inequality to the setting of estimation, providing lower bounds on the probability that an estimator of a discrete quantity is within some distance t of the quantity. The second inequality extends our bound to a continuum setting and provides a volume-based bound. We illustrate how these inequalities lead to direct and simple proofs of several statistical minimax lower bounds.
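The flavor of these bounds can be illustrated numerically. The sketch below is an assumption based on the standard statements of the classical Fano inequality and its distance-based extension, not code taken from the paper: for a parameter uniform over M hypotheses, the probability of error (or of landing farther than distance t from the truth) is lower-bounded in terms of the mutual information I between the parameter and the observations.

```python
import math

def fano_lower_bound(mutual_info, num_hypotheses):
    """Classical Fano: for V uniform on M >= 2 hypotheses,
    P(error) >= 1 - (I(V; Y) + log 2) / log M."""
    return max(0.0, 1.0 - (mutual_info + math.log(2))
               / math.log(num_hypotheses))

def distance_fano_lower_bound(mutual_info, num_hypotheses, max_ball_size):
    """Distance-based variant (standard form; check constants
    against the paper): P(rho(Vhat, V) > t) >= 1 - (I + log 2) / log(M / N_t),
    where N_t bounds how many hypotheses fit in any rho-ball of radius t."""
    return max(0.0, 1.0 - (mutual_info + math.log(2))
               / math.log(num_hypotheses / max_ball_size))

# Example: 16 hypotheses and uninformative observations (I = 0).
print(fano_lower_bound(0.0, 16))              # classical bound
print(distance_fano_lower_bound(0.0, 16, 2))  # radius-t balls hold <= 2 points
```

Note that with `max_ball_size = 1` the distance-based bound collapses to the classical one, which is the sanity check one would expect from the abstract's description.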


Related papers

Uniformly Consistent Empirical Likelihood Estimation Subject to a Continuum of Unconditional Moment Inequalities

This paper extends moment-based estimation procedures to statistical models defined by a continuum of unconditional moment inequalities. The underlying probability distribution in the model is the (infinite dimensional) parameter of interest. For a general class of estimating functions that indexes the continuum of moments, we develop the estimation theory of this parameter using the method of ...


Parameter Estimation of Some Archimedean Copulas Based on Minimum Cramér-von-Mises Distance

The purpose of this paper is to introduce a new estimation method for the Archimedean copula dependence parameter in the non-parametric setting. The dependence parameter is estimated as the value that minimizes the Cramér-von-Mises distance, which measures the distance between the empirical Bernstein Kendall distribution function and the true Kendall distribution functi...


Moment Inequalities for Supremum of Empirical Processes of‎ ‎U-Statistic Structure and Application to Density Estimation

We derive moment inequalities for the supremum of empirical processes of U-statistic structure and give an application to kernel-type density estimation and estimation of the distribution function for functions of observations.


Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, … and so on. Properties and results related to distance between probability d...


Local Privacy, Data Processing Inequalities, and Statistical Minimax Rates

Working under a model of privacy in which data remains private even from the statistician, we study the tradeoff between privacy guarantees and the utility of the resulting statistical estimators. We prove bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that depend on the privacy guarantees. When combined with standard minimax techniques...



Journal:
  • CoRR

Volume: abs/1311.2669  Issue: -

Pages: -

Publication date: 2013